2 research outputs found

    Adapted Extended Baum-Welch transformations

    The discriminative technique for estimating the parameters of Gaussian mixtures based on the Extended Baum-Welch (EBW) transformations has had a significant impact on the speech recognition community. In this paper we introduce a general definition of a family of EBW transformations that can be associated with a weighted sum of the updated and initial models. We compute a gradient-steepness measurement for a family of EBW transformations applied to functions of Gaussian mixtures and demonstrate the growth property of these transformations. We consider EBW transformations of discriminative functions in which the EBW control parameters are adapted to the gradient-steepness measurement or to the likelihood of the data given the model. We present experimental results showing that adapted EBW transformations can significantly speed up the estimation of Gaussian mixture parameters and yield better decoding results.
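The "weighted sum of updated and initial models" mentioned in the abstract can be illustrated with the classic EBW mean update for a single Gaussian, where a smoothing constant interpolates between the data-driven estimate and the initial model. This is a minimal sketch of that conventional formula; the names (`gamma`, `D`) and the function itself are illustrative conventions, not code from the paper.

```python
# Hedged sketch of the conventional EBW-style mean update:
#   mu_new = (sum_t gamma_t * x_t + D * mu_old) / (sum_t gamma_t + D)
# where gamma_t are per-frame occupation weights and D is the EBW
# control (smoothing) parameter. Larger D pulls the update back
# toward the initial model; D = 0 recovers the plain weighted mean.

def ebw_mean_update(data, gamma, mu_old, D):
    """Interpolate the data-driven mean with the initial mean via D."""
    numerator = sum(g * x for g, x in zip(gamma, data)) + D * mu_old
    denominator = sum(gamma) + D
    return numerator / denominator

# With D = 0 the update is just the weighted sample mean:
x = [1.0, 2.0, 3.0]
g = [1.0, 1.0, 1.0]
print(ebw_mean_update(x, g, mu_old=0.0, D=0.0))  # -> 2.0
# A large D keeps the estimate close to mu_old (a smaller, safer step):
print(ebw_mean_update(x, g, mu_old=5.0, D=1000.0))  # close to 5.0
```

Adapting `D` online (e.g. to a gradient-steepness measurement, as the abstract describes) trades off step size against stability of the iteration.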

    Generalization of Extended Baum-Welch Parameter Estimation for Discriminative Training and Decoding

    We demonstrate the generalizability of the Extended Baum-Welch (EBW) algorithm not only for HMM parameter estimation but for decoding as well. We show that there can exist a general function associated with the objective function under EBW that reduces to the well-known auxiliary function used in the Baum-Welch algorithm for maximum-likelihood estimates. We generalize the representation of the model-parameter updates by applying a differentiable function (such as an arithmetic or geometric mean) to the updated and current model parameters, and describe its effect on the learning rate during HMM parameter estimation. Improvements on speech recognition tasks are also presented.
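The abstract's idea of combining updated and current parameters through a differentiable mean can be sketched as follows. These helper functions are illustrative assumptions, not the paper's exact formulas; they only show how the choice of mean acts like a learning-rate control.

```python
import math

# Hedged illustration: blend a current parameter value with its newly
# computed update via a differentiable mean. The choice of mean (and,
# for the arithmetic case, the weight alpha) governs the effective
# step size of the iteration. Names here are hypothetical.

def arithmetic_step(current, updated, alpha=0.5):
    # alpha = 0 keeps the current value; alpha = 1 jumps fully to the update.
    return (1 - alpha) * current + alpha * updated

def geometric_step(current, updated):
    # Geometric mean (for positive parameters such as variances);
    # a more conservative step when updated >> current.
    return math.sqrt(current * updated)

print(arithmetic_step(1.0, 9.0))  # -> 5.0
print(geometric_step(1.0, 9.0))   # -> 3.0
```

For the same pair of values the geometric mean takes the smaller step, which is one way a differentiable combining function can slow or stabilize learning.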